Variance-based sensitivity analysis is a form of global sensitivity analysis.〔Saltelli, A., Ratto, M., Andres, T., Campolongo, F., Cariboni, J., Gatelli, D., Saisana, M., and Tarantola, S., 2008, ''Global Sensitivity Analysis. The Primer'', John Wiley & Sons.〕 Working within a probabilistic framework, it decomposes the variance of the output of the model or system into fractions which can be attributed to inputs or sets of inputs. For example, given a model with two inputs and one output, one might find that 70% of the output variance is caused by the variance in the first input, 20% by the variance in the second, and 10% by interactions between the two. These percentages are directly interpreted as measures of sensitivity. Variance-based measures of sensitivity are attractive because they measure sensitivity across the whole input space (i.e. it is a global method), they can deal with nonlinear responses, and they can measure the effect of interactions in non-additive systems.〔Saltelli, A., Annoni, P., 2010, How to avoid a perfunctory sensitivity analysis, ''Environmental Modelling and Software'' 25, 1508–1517.〕

==Decomposition of Variance==

From a black box perspective, any model may be viewed as a function ''Y'' = ''f''('''X'''), where '''X''' is a vector of ''d'' uncertain model inputs <math>\{X_1, X_2, \dots, X_d\}</math>, and ''Y'' is a chosen univariate model output (note that this approach examines scalar model outputs, but multiple outputs can be analysed by multiple independent sensitivity analyses). Furthermore, it will be assumed that the inputs are independently and uniformly distributed within the unit hypercube, i.e. <math>X_i \in [0, 1]</math> for <math>i = 1, 2, \dots, d</math>. This incurs no loss of generality because any input space can be transformed onto this unit hypercube. ''f''('''X''') may be decomposed in the following way,〔Sobol’, I. (1990). Sensitivity estimates for nonlinear mathematical models. ''Matematicheskoe Modelirovanie'' 2, 112–118. In Russian; translated into English in Sobol’, I. (1993). Sensitivity analysis for non-linear mathematical models. ''Mathematical Modeling & Computational Experiment (Engl. Transl.)'', 1993, 1, 407–414.〕

:<math> Y = f_0 + \sum_{i=1}^d f_i(X_i) + \sum_{i<j}^{d} f_{ij}(X_i, X_j) + \cdots + f_{12 \dots d}(X_1, X_2, \dots, X_d) </math>

where ''f''<sub>0</sub> is a constant and ''f''<sub>''i''</sub> is a function of ''X''<sub>''i''</sub>, ''f''<sub>''ij''</sub> a function of ''X''<sub>''i''</sub> and ''X''<sub>''j''</sub>, etc. A condition of this decomposition is that,

:<math> \int_0^1 f_{i_1 i_2 \dots i_s}(X_{i_1}, X_{i_2}, \dots, X_{i_s}) \, dX_{i_k} = 0, \quad \text{for } k = 1, \dots, s </math>

i.e. all the terms in the functional decomposition are orthogonal. This leads to definitions of the terms of the functional decomposition in terms of conditional expected values,

:<math> f_0 = E(Y) </math>
:<math> f_i(X_i) = E(Y \mid X_i) - f_0 </math>
:<math> f_{ij}(X_i, X_j) = E(Y \mid X_i, X_j) - f_0 - f_i(X_i) - f_j(X_j) </math>

From this it can be seen that ''f''<sub>''i''</sub> is the effect of varying ''X''<sub>''i''</sub> alone (known as the main effect of ''X''<sub>''i''</sub>), and ''f''<sub>''ij''</sub> is the effect of varying ''X''<sub>''i''</sub> and ''X''<sub>''j''</sub> simultaneously, ''additional to the effect of their individual variations''. This is known as a second-order interaction. Higher-order terms have analogous definitions.

Now, further assuming that ''f''('''X''') is square-integrable, the functional decomposition may be squared and integrated to give,

:<math> \int_0^1 f^2(\textbf{X}) \, d\textbf{X} - f_0^2 = \sum_{s=1}^d \sum_{i_1 < \dots < i_s}^{d} \int f^2_{i_1 \dots i_s} \, dX_{i_1} \dots dX_{i_s} </math>

Notice that the left hand side is equal to the variance of ''Y'', and the terms of the right hand side are variance terms, now decomposed with respect to sets of the ''X''<sub>''i''</sub>. This finally leads to the decomposition of variance expression,

:<math> \operatorname{Var}(Y) = \sum_{i=1}^d V_i + \sum_{i<j}^{d} V_{ij} + \cdots + V_{12 \dots d} </math>

where

:<math> V_i = \operatorname{Var}_{X_i} \left( E_{\textbf{X}_{\sim i}} \left( Y \mid X_i \right) \right), </math>
:<math> V_{ij} = \operatorname{Var}_{X_{ij}} \left( E_{\textbf{X}_{\sim ij}} \left( Y \mid X_i, X_j \right) \right) - V_i - V_j </math>

and so on. The <math>\textbf{X}_{\sim i}</math> notation indicates the set of all variables ''except'' ''X''<sub>''i''</sub>. The above variance decomposition shows how the variance of the model output can be decomposed into terms attributable to each input, as well as the interaction effects between them. Together, all terms sum to the total variance of the model output.
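The decomposition can be checked numerically. The sketch below (not part of the original article) uses an assumed two-input test model ''Y'' = ''X''<sub>1</sub> + 2''X''<sub>2</sub> + ''X''<sub>1</sub>''X''<sub>2</sub> with independent uniform inputs and estimates ''V''<sub>1</sub> and ''V''<sub>2</sub> by brute-force double-loop Monte Carlo, obtaining the interaction term ''V''<sub>12</sub> by difference; the model choice, sample sizes and estimator are illustrative assumptions, not the estimators used in practice.

<syntaxhighlight lang="python">
# Minimal sketch: numerically check Var(Y) = V_1 + V_2 + V_12 for the assumed
# test model Y = X1 + 2*X2 + X1*X2, with X1, X2 ~ U(0, 1) independent.
import numpy as np

rng = np.random.default_rng(0)

def model(x1, x2):
    return x1 + 2.0 * x2 + x1 * x2

n_outer = 2000   # samples of the conditioning input X_i
n_inner = 2000   # samples used to estimate each conditional mean E(Y | X_i)

# Total variance Var(Y) from a plain Monte Carlo sample
x1 = rng.random(200_000)
x2 = rng.random(200_000)
var_y = np.var(model(x1, x2))

def partial_variance(fix_first):
    """Estimate V_i = Var_{X_i}( E_{X_~i}(Y | X_i) ) by double-loop Monte Carlo."""
    cond_means = np.empty(n_outer)
    for k in range(n_outer):
        xi = rng.random()                 # one fixed value of X_i
        x_other = rng.random(n_inner)     # resample the remaining input X_~i
        y = model(xi, x_other) if fix_first else model(x_other, xi)
        cond_means[k] = y.mean()          # inner mean approximates E(Y | X_i = xi)
    return np.var(cond_means)             # outer variance approximates V_i

v1 = partial_variance(fix_first=True)
v2 = partial_variance(fix_first=False)
v12 = var_y - v1 - v2                     # second-order interaction, by difference

print(f"Var(Y) = {var_y:.4f}")
print(f"V1     = {v1:.4f}  ({v1 / var_y:.1%} of variance)")
print(f"V2     = {v2:.4f}  ({v2 / var_y:.1%} of variance)")
print(f"V12    = {v12:.4f}  ({v12 / var_y:.1%} of variance)")
</syntaxhighlight>

For this particular model the partial variances can also be worked out analytically (''V''<sub>1</sub> = 3/16 ≈ 0.188, ''V''<sub>2</sub> = 25/48 ≈ 0.521, ''V''<sub>12</sub> = 1/144 ≈ 0.007, Var(''Y'') = 103/144 ≈ 0.715), so the Monte Carlo estimates should land close to those values and sum to the total variance, up to sampling error.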